Readers are getting suspicious about AI. So I asked over 30 book editors how they were dealing with the problem of AI-generated book submissions. It took considerable labour to track these editors down in the first place. Most email addresses were hidden from the public, so I tried to reverse-engineer them, making forensic adjustments whenever a message bounced back undelivered. I tried LinkedIn. I tried company websites, which had become labyrinthine after years of corporate mergers.
In the end, none of them would tell me anything. It was difficult to blame them for staying quiet. Any admission would have led to even more blame, the kind they were already getting in spades on social media. They were taking audiences and advances from the deserving. They were destroying trust. And they were gatekeeping, but not doing so very well. It was a few weeks after one of the biggest literary scandals of the year, and something in publishing had gone very wrong.
This furore started with one book and one protagonist. The setup raised no eyebrows; like most main characters in the semi-feminist “femgore” niche, Gia was “thirty, alone, and unraveling quietly enough that [no one] noticed. Yet.” Other things about Gia were slightly strange. Her tears, for example, felt “different. They are not despair, not hopelessness. They feel like something else entirely—something sharp and terrifying, carving space for whatever comes next.”
Almost everything in Gia’s story was sharp. She knew her competitors on the dating market were “sharper and shinier” than her. When she met a potential sugar daddy to take her out of financial ruin, his laughter was “sharp and jagged.” A paragraph later, his voice was “soft but sharp, a knife wrapped in velvet.” He asked her to act like a dog. A hot shower was “sharp and grounding.” Her shame was “sharp as a knife edge;” her guilt was “sharp and intrusive.”
In a trademark burst of catharsis, Gia turned into a real dog and mauled him to death. “My hands grip the wheel,” she said on her escape, “but they’re not hands anymore—claws, thick and dark, the tips sharp enough to leave faint scratches in the leather…”
When Mia Ballard’s self-published Shy Girl rose to the top of Amazon’s horror chart last spring, several of the internet’s sharper readers smelled a sharp rat. “I recently started this book and I’m so confused by it… there are a lot of weird formatting issues?” one wrote on Reddit in May 2025. “I thought maybe it was intentional, like the main character was losing her mind so her writing became unorganized. But now I’m thinking maybe it’s entirely AI written.”
Nobody in the industry seemed to catch Shy Girl’s early reviews, or its strange sentence patterns, or its overuse of the word “sharp,” which popped up over 180 times in total. The canine saga exited the self-publishing world and reached editors at two different imprints without raising any actionable alarm. Wildfire, an imprint of the Hachette-owned Headline Books, bought the novel and released it in the UK last November. It had all the requisite cover quotes and comparisons. It had an Ottessa Moshfegh-esque front cover featuring a glamorous Irish Wolfhound. It was set to become the latest “female rage” sensation. Ballard was gearing up for an American release through Orbit Books, another Hachette imprint, when the accusations finally caught up with her.
It took another lengthy Reddit post and a three-hour YouTube video to bring the writer down. Ballard is the first traditionally published author to face serious career consequences after being accused of AI use. Shy Girl was withdrawn from publication last month. Orbit did not respond to a request for comment.
Industry peers have not taken the fallout well. “We are busting our balls over here, going to school for MFAs, writing many books over many years…” went a sharply written post on Reddit’s traditional publishing forum, r/PubTips, “and Shy Girl literally was about to go on shelves in the US next month.”
“We’re told constantly this industry is so rigorous, so discerning, and that’s why we have to wait months to hear back, why we have to get rejection after rejection…” commented another hopeful author, “and then this book, which anyone with a brain could tell was written by ChatGPT, gets picked up and ran with and seemingly no one in the process of publishing it noticed or cared.”
Their frustration is understandable. Most first-time novelists must finish a convincing, original manuscript before approaching multiple agents in a lengthy “querying” process. Agents decide whether books are likely to see commercial or literary success. If they sense potential, they’ll help authors through rounds of developmental and line edits. End products “go on submission,” which means they’re finally offered up to editors at publishing houses. Every stage involves waiting, stress, and potential ego death.
In a dynamic that has drawn considerable ire, this system keeps much of the writing public away from the literary establishment. It also seems to have protected the reading public from the worst excesses of AI-generated prose. Two literary agents told me they’d seen an uptick in AI-authored submissions. One, who mostly deals with non-fiction, has spotted a rising “level of repetition and inauthenticity” in his pitch inbox. “If I see something is AI-written, I don’t see a point in representing [it],” he said. “We’re a human-centred industry with human readers.”
It might comfort horror fans to learn that Ballard is an exception. According to early Bookseller coverage, a commissioning editor at Wildfire bought Shy Girl directly from its author. No middleman was involved. Ballard didn’t need to convince anyone the book would be a commercial success, because it was already at the top of the Amazon horror charts. Such deals are no longer unheard of: one agent told me she’d noticed colleagues scanning the subscription platform Kindle Unlimited for promising work. But they are still a relatively rare path to traditional publication. One goal of the established system is to vet authors early on, precisely so that scandals like this do not happen, and the majority of authors still pass through it. Ballard turned sharply away.
Something must have been going on in the publishing houses. A few days after being spurned by a list of editors, I finally got hold of an interviewee, who agreed to be identified only as a worker in corporate publishing. It wasn’t surprising, she said, that nobody was willing to talk. The industry was already fairly opaque. Its operational ranks were certainly using AI themselves; those employees seemed positive about its everyday utility. There was talk of using it to write the blurbs on the back of books, and to do administrative tasks that might once have gone to low-level assistants.
The editorial side had a longstanding understaffing problem. My source said industry layoffs had largely affected managing editors, who oversee the whole publishing process, and copy editors, who examine manuscripts in detail for spelling, grammar and accuracy issues. The Big Five of ten or fifteen years ago might have had the resources to catch Ballard before she went to print. The industry is still full of “good readers,” an agent assured me. But its remaining editors are overworked, with less time to handle increased churn. Sometimes they end up skipping steps. While publishing companies once dealt with developmental edits in-house, literary agencies now need to hand over nearly publishable work to make a sale.
Precarious economic conditions mean editors are shrinking from potential financial risks.
The industry’s most profitable novelists cluster towards dependable genre groupings in romance, horror and fantasy. Usefully for AI, these niches all revolve around “tropes”; our current crop of romance readers can be persuaded to buy a book for the sake of its common-denominator narrative patterns, like the self-explanatory “enemies to lovers” and the puzzlingly frequent “there’s only one bed.” Some would-be agents and editors list their preferred tropes when soliciting manuscripts. Some authors list them when promoting themselves to readers.
It’s also de rigueur to think in terms of “comps,” or comparative works, when trying to sell a book. Agents and editors need to know there is a potential market for a project before taking it on; book publicists need a built-in target audience. Ballard clearly brainstormed Shy Girl as a portmanteau of several other, more successful novels. This wasn’t an issue for her core audience, who have spent the last few years learning to think of literature as a series of minor derivations. The “femgore” niche had literary-ish roots in Rachel Yoder’s Nightbitch (woman becomes dog) and Eliza Clark’s Boy Parts (woman tortures and kills men for sexual kicks). Eventually enough writers piled in, categorising and stealing each other’s motifs and metaphors, that the scene metastasised into genre fiction.
Many tropes found their formalised footing in the world of fanfiction, and filtered into traditional publishing around the start of the decade. Comps have been a staple of the publishing process for longer. It just so happens that the resultant landscape is easy for AI to lay waste to. LLMs cannot, by definition, express anything in a novel way. They struggle with meter and scansion because they have no internal conception of time. But they are good at cannibalising ideas and imagery, and sometimes at plagiarising on command. In an effort to place safe bets, the book world has accidentally revalued itself in terms of things that ChatGPT can also do.
Some have realised they stand to profit from the asset-stripping of the literary sphere. The Shy Girl scandal has done much to break down institutional trust; these unscrupulous parties will only damage it more. Last week, a mostly incoherent and likely AI-generated op-ed in the Bookseller made the case for a “frank discussion” of AI in publishing. “The industry’s response,” it concluded, “needs to shift from catching undisclosed use to making disclosure ordinary – in contracts, in submissions, in editorial conversations.” The credited author, once editor-in-chief of Penguin’s Indian arm, was trying to sell her own AI editing tool.
My industry source is worried we’ll see a “slow bleed” of AI-generated writing into traditional publishing. The current backlash suggests publishers would need to hide it from us, the way you’d sneak medicine into a cat’s food. They depend on the laundering of legitimacy and prestige, and are on the way to losing both. They will need to staff their offices again. They will need to work out how to detect the vestiges of ChatGPT, making sure no book slips through the cracks. The response must be brisk. And, for that matter, sharp.
[Further reading: AI will dissolve civilisation as we know it]